# 18a Proof of the algorithm for removing redundant vectors

In the previous note, we made the claim

> **Claim. An algorithm to remove redundant vectors from a spanning set.**
> Given $k$ many **column vectors** $v_{1},v_{2},\ldots,v_{k}$, form the matrix $A=\begin{bmatrix}v_{1}\ |\ v_{2}\ |\cdots| \ v_{k}\end{bmatrix}$ with those as columns. Suppose after row-reducing $A$ to an echelon form $\tilde A$, this echelon form has $r$ many pivots that occur at columns $i_{1},i_{2},\ldots,i_{r}$. Then $$ \operatorname{span} (v_{1},v_{2},\ldots,v_{k}) = \operatorname{span} (v_{i_{1}},v_{i_{2}},\ldots,v_{i_{r}}). $$ That is to say, we can remove the columns that do not end up with a pivot in an echelon form.

Our toolkit for proving this will be row-reduction to echelon forms, modifying the matrices that we row-reduce, and carefully interpreting what the results mean. But before we write down the proof, we illustrate a concrete scenario to better understand it.

Suppose we have 6 column vectors $v_{1},v_{2},v_{3},v_{4},v_{5},v_{6}$, and let us form the matrix $A=\begin{bmatrix}v_{1}\ |\ v_{2}\ |\cdots| \ v_{6}\end{bmatrix}$. And suppose after row-reducing $A$ we obtain the following echelon form $$ A=\begin{bmatrix} \\ v_{1} & v_{2} & v_{3} & v_{4} & v_{5} & v_{6} \\ \ \end{bmatrix} \stackrel{\text{row}}\sim \begin{bmatrix}\colorbox{lightblue}{\(2\)} & 1 & 0 & -2 & 2 & 3 \\ 0 & \colorbox{lightblue}{\(5\)} & 3 & 7 & 1 & 0\\ 0 & 0 & 0 & \colorbox{lightblue}{\(-3\)} & 8 & 2\end{bmatrix} $$Ok, here the first vector we can remove is $v_{3}$. But why? Let us consider $v_{3}$ and the pivoted columns **before it**, namely $v_{1}$ and $v_{2}$. We claim we can write $v_{3}$ as a linear combination of $v_{1}$ and $v_{2}$.
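As a quick sanity check of the pivot pattern, we can feed this matrix to SymPy, whose `rref` method reports the pivot columns. A minimal sketch, using the echelon-form entries above as the matrix (row-reducing an echelon matrix leaves the pivot columns unchanged):

```python
# Locate the pivot columns of the matrix from the worked example.
from sympy import Matrix

A = Matrix([
    [2, 1, 0, -2, 2, 3],
    [0, 5, 3,  7, 1, 0],
    [0, 0, 0, -3, 8, 2],
])

# rref() returns the reduced row echelon form together with the pivot columns.
rref_form, pivot_cols = A.rref()
print(pivot_cols)  # (0, 1, 3), i.e. columns v1, v2, v4 in 0-indexed counting
```

So the free (non-pivot) columns are $v_{3},v_{5},v_{6}$, matching the picture above.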
Indeed, forming the **augmented matrix** and row-reducing, we get$$ \begin{bmatrix} & & \vdots \\ v_{1} & v_{2} & \vdots & v_{3} \\ \ & & \vdots\end{bmatrix} \stackrel{\text{row}}\sim \begin{bmatrix}\colorbox{lightblue}{\(2\)} & 1 & \vdots& 0 \\ 0 & \colorbox{lightblue}{\(5\)} &\vdots & 3\\ 0 & 0 & \vdots & 0\end{bmatrix}. $$We get this echelon form because we already row-reduced it from $A$! This is just "ignoring the other columns"! Since the augmented column (here $v_{3}$) does not get a pivot, this linear system is consistent. Hence $v_{3}$ is a linear combination of $v_{1}$ and $v_{2}$.

The next vector to be removed is $v_{5}$. Indeed, look at the pivoted columns before $v_{5}$, namely $v_{1},v_{2},v_{4}$. We claim that $v_5$ is a linear combination of $v_{1},v_{2},v_{4}$. And indeed, if we set up the augmented matrix and row-reduce to an echelon form, we get $$ \begin{bmatrix} & & & \vdots \\ v_{1} & v_{2} & v_{4} & \vdots & v_{5} \\ \ & & & \vdots\end{bmatrix} \stackrel{\text{row}}\sim \begin{bmatrix}\colorbox{lightblue}{\(2\)} & 1 & -2 & \vdots& 2 \\ 0 & \colorbox{lightblue}{\(5\)} & 7 &\vdots & 1\\ 0 & 0 &\colorbox{lightblue}{\(-3\)} & \vdots & 8\end{bmatrix}, $$which again is consistent, because no pivot shows up in the augmented column when we place $v_{1},v_{2},v_{4}$ before $v_{5}$, as shown in the original row reduction of $A$. So $v_{5}$ is a linear combination of $v_{1},v_{2},v_{4}$.

Lastly, the remaining vector to be removed is $v_{6}$, since it too is a linear combination of $v_{1},v_{2},v_{4}$, by the same reasoning -- augmenting $v_{6}$ to $v_{1},v_{2},v_{4}$ gives $$ \begin{bmatrix} & & & \vdots \\ v_{1} & v_{2} & v_{4} & \vdots & v_{6} \\ \ & & & \vdots\end{bmatrix} \stackrel{\text{row}}\sim \begin{bmatrix}\colorbox{lightblue}{\(2\)} & 1 & -2 & \vdots& 3 \\ 0 & \colorbox{lightblue}{\(5\)} & 7 &\vdots & 0\\ 0 & 0 &\colorbox{lightblue}{\(-3\)} & \vdots & 2\end{bmatrix}, $$which is consistent again.
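These three consistency claims can be double-checked numerically. A sketch, again using the echelon-form entries as stand-ins for the actual columns: we solve each augmented system by least squares and call the target a linear combination exactly when the residual vanishes.

```python
# Numerically verify that each free column lies in the span of the
# pivoted columns before it (entries taken from the echelon form above).
import numpy as np

v1 = np.array([2., 0., 0.])
v2 = np.array([1., 5., 0.])
v3 = np.array([0., 3., 0.])
v4 = np.array([-2., 7., -3.])
v5 = np.array([2., 1., 8.])
v6 = np.array([3., 0., 2.])

def is_combination(target, *basis):
    """True iff `target` lies in span(basis): the least-squares
    solution reproduces `target` exactly (zero residual)."""
    A = np.column_stack(basis)
    coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.allclose(A @ coeffs, target)

print(is_combination(v3, v1, v2))      # True
print(is_combination(v5, v1, v2, v4))  # True
print(is_combination(v6, v1, v2, v4))  # True
```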
Hence we see that **each column that ends up as a free column is a linear combination of the columns before it that end up with a pivot**, in this case $$ \begin{align*} v_{3} & \in \operatorname{span} (v_{1},v_{2}) \\ v_{5} & \in \operatorname{span} (v_{1},v_{2},v_{4}) \\ v_{6} & \in \operatorname{span} (v_{1},v_{2},v_{4}) \end{align*} $$So what this means is, any linear combination involving $v_{3},v_{5},v_{6}$ can always be rewritten as a linear combination of $v_{1},v_{2},v_{4}$ alone, whence we have $$ \operatorname{span} (v_{1},v_{2},\cancel{v_{3}},v_{4},\cancel{v_{5}},\cancel{v_{6}})=\operatorname{span} (v_{1},v_{2},v_{4}) $$where we keep only the vectors that end up with a pivot in an echelon form while having the same span. Here we have three pivots, so three vectors are kept. Since this number of pivots in a matrix keeps showing up, we make the following definition

> **Definition.** Given a matrix $A$, we define the **rank of the matrix** $A$, written as $$ \text{rank}(A)=\text{rk}(A) $$to be the number of pivots in **an** echelon form of $A$. (Remark. This is remarkable -- because it says we can determine the rank of $A$ by row-reducing it to **any** echelon form.)

Let us now formalize this in a proof. It will be generic with a lot of "$\ldots$" but follows the concrete scenario above.

$\blacktriangleright$ Proof of the algorithm that removes redundant vectors. Let $v_{1},\ldots,v_{k}$ be $k$ column vectors where the matrix $A=\begin{bmatrix}v_{1}\ |\ v_{2}\ |\cdots| \ v_{k}\end{bmatrix}$ row-reduces to an echelon form $\tilde A$ with **pivots** on the $r$ columns $i_{1} < i_{2}< \cdots < i_{r}$ of $\tilde A$. We claim that if column $j$ **does not** have a pivot in the echelon form $\tilde A$, then $v_{j}$ is a **linear combination** of $v_{i_{1}},v_{i_{2}},\ldots ,v_{i_{m}}$, where $i_{m}$ is the largest pivot index such that $i_{m} < j$. (So all the pivoted columns before column $j$.)
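The rank just defined can also be read off numerically. A small sketch, reusing the echelon-form entries from the worked example, where we counted three pivots:

```python
# The rank of the example matrix equals its number of pivots, namely 3.
import numpy as np

A = np.array([
    [2, 1, 0, -2, 2, 3],
    [0, 5, 3,  7, 1, 0],
    [0, 0, 0, -3, 8, 2],
])

print(np.linalg.matrix_rank(A))  # 3
```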
Indeed, to check if $v_{j}$ is a linear combination of $v_{i_{1}},v_{i_{2}},\ldots ,v_{i_{m}}$ amounts to setting up the **augmented matrix** $$ \begin{bmatrix} & & & & \vdots\\ v_{i_{1}} & v_{i_{2}} & \cdots & v_{i_{m}} & \vdots & v_{j} \\ & & & & \vdots\end{bmatrix} $$and row-reducing it. However, row-reducing this augmented matrix is the same process as row-reducing $A$ to $\tilde A$, just ignoring all the other columns. And in doing so, we know we will **not get a pivot** in the augmented column, as that is how it appears in $\tilde A$. Hence the augmented matrix above corresponds to a consistent linear system, meaning $v_{j}$ is a linear combination of $v_{i_{1}},v_{i_{2}},\ldots ,v_{i_{m}}$. Since this shows each column that does not end up with a pivot in the echelon form $\tilde A$ is a linear combination of the previous pivoted columns, we can remove them without changing the span. Namely, $$ \operatorname{span} (v_{1},v_{2},\ldots,v_{k}) = \operatorname{span} (v_{i_{1}},v_{i_{2}},\ldots,v_{i_{r}}) $$as claimed, where $r$ is the number of pivots of the matrix $A$ (that is, its **rank**). $\blacksquare$

**Remark.** We have not fully addressed this, only alluded to it: after removing the redundant vectors using this claim, **can we remove more** while still maintaining the same span? This I want you to think about, but it has to do with the idea of linear independence.
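The whole algorithm fits in a few lines of code. A sketch using SymPy's exact row reduction, keeping precisely the columns that carry a pivot; the six test columns are the stand-in vectors from the worked example:

```python
# Keep exactly the columns that get a pivot in an echelon form;
# their span equals the span of all the input columns.
from sympy import Matrix

def remove_redundant(columns):
    """Return the sublist of `columns` whose span equals span(columns)."""
    A = Matrix.hstack(*(Matrix(c) for c in columns))
    _, pivot_cols = A.rref()  # pivot columns of an echelon form of A
    return [columns[j] for j in pivot_cols]

# Columns v1, ..., v6 of the worked example (echelon-form entries as stand-ins).
cols = [[2, 0, 0], [1, 5, 0], [0, 3, 0], [-2, 7, -3], [2, 1, 8], [3, 0, 2]]
print(remove_redundant(cols))  # keeps v1, v2, v4
```

Note that `rref` goes all the way to the *reduced* echelon form, but since every echelon form of $A$ has pivots in the same columns, the selected columns agree with the claim.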